Nachdiplom Lecture: Statistics Meets Optimization 4.1 Problem Set up and Motivation

Author

  • Yuting Wei
Abstract

Last time, we proved that a sketch dimension m ≳ (1/δ²) W²(AK(x_LS) ∩ S^{n−1}) is sufficient to ensure this property. A related but slightly different notion of approximation is that of solution approximation, in which we measure the quality in terms of some norm of the difference between x̂ and x_LS. Defining the (semi-)norm ‖u‖_A := √(uᵀAᵀAu / n), let us say that x̂ is a δ-accurate solution approximation if

    ‖x_LS − x̂‖_A ≤ δ.    (4.4)
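As a concrete illustration of this solution-approximation metric, the following sketch (not part of the original notes; all dimensions and the unconstrained choice C = R^d are illustrative assumptions) compares a Gaussian-sketched solution to x_LS in the ‖·‖_A norm:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 1000, 20, 200

A = rng.standard_normal((n, d))
y = A @ rng.standard_normal(d) + rng.standard_normal(n)

# Unconstrained least-squares solution x_LS.
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

# Gaussian sketch: solve the m-dimensional sketched problem instead.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat, *_ = np.linalg.lstsq(S @ A, S @ y, rcond=None)

def norm_A(u, A):
    """(Semi-)norm ||u||_A = sqrt(u^T A^T A u / n) = ||A u||_2 / sqrt(n)."""
    return np.linalg.norm(A @ u) / np.sqrt(A.shape[0])

delta = norm_A(x_ls - x_hat, A)
print(f"||x_LS - x_hat||_A = {delta:.4f}")
```

With m substantially larger than d, the measured δ is small relative to the noise scale.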


Similar resources

Nachdiplom Lecture: Statistics Meets Optimization, Fall 2015. Lecture 1 – Tuesday, September 22

In the modern era of high-dimensional data, the interface of mathematical statistics and optimization has become an increasingly vibrant area of research. The goal of these lectures is to touch on various evolving areas at this interface. Before going into the details proper, let’s consider some high-level ways in which the objectives of optimization can be influenced by underlying statistical ob...


Nachdiplom Lecture: Statistics Meets Optimization 3.1 Problem Set up and Motivation

Today, we analyze an application of random projection to computing approximate solutions of constrained least-squares problems. This method is often referred to as sketched least squares. Suppose that we are given an observation vector y ∈ R^n and a matrix A ∈ R^{n×d}, and that for some convex set C ⊆ R^d, we would like to compute the constrained least-squares solution x_LS := argmin_{x∈C} (1/2)‖y − A...
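A minimal sketch of the method for the unconstrained case C = R^d (all dimensions, the noise level, and the dense Gaussian sketch are illustrative assumptions, not prescriptions from the notes):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, m = 2000, 10, 100

A = rng.standard_normal((n, d))
y = A @ np.ones(d) + 0.5 * rng.standard_normal(n)

# Gaussian sketch S in R^{m x n}, scaled so that E[S^T S] = I_n.
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the sketched problem min_x (1/2)||S(y - Ax)||_2^2,
# an m-dimensional problem, instead of the full n-dimensional one.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ y, rcond=None)
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.linalg.norm(x_sketch - x_ls))
```

The point of the construction is that the sketched problem has only m rows, so the solve itself is much cheaper once SA has been formed.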


Nachdiplom Lecture: Statistics Meets Optimization. Lecture 2 – Tuesday, September 29. 2.1 A General Theorem on Gaussian Random Projections

Let K be a subset of the Euclidean sphere S^{d−1}. As seen in Lecture #1, in analyzing how well a given random projection matrix S ∈ R^{m×d} preserves vectors in K, a central object is the random variable

    Z(K) = sup_{u∈K} ‖Su‖₂² / m − 1.    (2.1)

Suppose that our goal is to establish that, for some δ ∈ (0, 1), we have Z(K) ≤ δ with high probability. How large must the projection dimension m be, as a funct...
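A quick Monte Carlo illustration of how Z(K) concentrates for a Gaussian projection (not from the notes; the finite set K of 100 random unit vectors and all dimensions are assumptions chosen for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)
d, m = 50, 500

# K: a finite subset of the sphere S^{d-1} (here, 100 random unit vectors).
K = rng.standard_normal((100, d))
K /= np.linalg.norm(K, axis=1, keepdims=True)

# Gaussian random projection S in R^{m x d}.
S = rng.standard_normal((m, d))

# Z(K) = sup_{u in K} ||Su||_2^2 / m - 1.
Z = np.max(np.sum((K @ S.T) ** 2, axis=1) / m - 1.0)
print(f"Z(K) = {Z:.4f}")
```

For each fixed unit vector u, ‖Su‖₂²/m is a normalized chi-squared variable with mean 1, so Z(K) measures the worst-case deviation over K; it shrinks as m grows.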


Nachdiplom Lecture: Statistics Meets Optimization Some Comments on Relative Costs

Directly solving the ordinary least squares problem will (in general) require O(nd²) operations. From Table 5.1, the Gaussian sketch does not actually improve upon this scaling for unconstrained problems: when m ≳ d (as is needed in the unconstrained case), computing the sketch SA requires O(nd²) operations as well. If we compute sketches using the JLT, then this cost is reduced to O(nd log(d)...
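To make these scalings concrete, here is a rough back-of-the-envelope comparison (illustrative only: the dropped constants, the m·d² term for the sketched solve, and the O(nd log d) JLT cost are assumptions based on the discussion above):

```python
import numpy as np

# Rough operation counts for least squares with n samples, d features,
# and sketch dimension m (constants dropped; illustrative only).
def cost_direct(n, d):
    return n * d**2                        # e.g. QR / normal equations on A

def cost_gaussian_sketch(n, d, m):
    return m * n * d + m * d**2            # dense product SA, then m x d solve

def cost_jlt_sketch(n, d, m):
    return n * d * np.log2(d) + m * d**2   # fast JL transform, then m x d solve

n, d, m = 10**5, 200, 1000
print(cost_direct(n, d), cost_gaussian_sketch(n, d, m), cost_jlt_sketch(n, d, m))
```

Under this crude model, the dense Gaussian sketch with m ≳ d is no cheaper than the direct solve, while the fast-transform sketch is, matching the discussion above.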


Nachdiplom Lecture: Statistics Meets Optimization, Fall 2015. Lecture 9 – Tuesday, December 1

A random vector x_i ∈ R^d is said to be drawn from a spiked covariance model if it can be written in the form x_i = F* √Γ ξ_i + w_i, where F* ∈ R^{d×r} is a fixed matrix with orthonormal columns; Γ = diag{γ₁, …, γ_r} is a diagonal matrix with γ₁ ≥ γ₂ ≥ ⋯ ≥ γ_r > 0; ξ_i ∈ R^r is a zero-mean random vector with identity covariance; and w_i is a zero-mean random vector, independent of ξ_i, and with identity ...
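A minimal sketch of sampling from this model (the dimensions d and r, the spike values γ, and the Gaussian choices for ξ_i and w_i are all illustrative assumptions), checking that the empirical covariance approaches the population covariance F*ΓF*ᵀ + I_d:

```python
import numpy as np

rng = np.random.default_rng(3)
d, r, n = 30, 3, 50000
gammas = np.array([5.0, 2.0, 1.0])  # spike eigenvalues, gamma_1 >= ... >= gamma_r > 0

# Orthonormal columns F* via QR of a random Gaussian matrix.
F, _ = np.linalg.qr(rng.standard_normal((d, r)))

# Draw n samples x_i = F* sqrt(Gamma) xi_i + w_i (rows of X).
xi = rng.standard_normal((n, r))
w = rng.standard_normal((n, d))
X = xi @ (np.sqrt(gammas)[:, None] * F.T) + w

# Population covariance is F* Gamma F*^T + I_d; compare the empirical estimate.
Sigma_hat = X.T @ X / n
Sigma = F @ np.diag(gammas) @ F.T + np.eye(d)
print(np.linalg.norm(Sigma_hat - Sigma, ord=2))
```

The top r eigenvalues of Σ are 1 + γ_k, "spiked" above the noise floor of 1, which is what principal component analysis exploits in this model.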




Publication year: 2015